Regularization Method of Learning Data Sets for Inverse Analyses Using Neural Networks.
Authors
Abstract
Similar works
A regularization method for solving a nonlinear backward inverse heat conduction problem using discrete mollification method
The present essay scrutinizes the application of discrete mollification as a filtering procedure to solve a nonlinear backward inverse heat conduction problem in one-dimensional space. These problems are seriously ill-posed. So, we combine discrete mollification and the space marching method to address the ill-posedness of the proposed problem. Moreover, a proof of stability and...
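The abstract above treats discrete mollification as a smoothing filter for ill-posed data. A minimal sketch of the underlying idea, under the common formulation of mollification as convolution with a truncated, normalized Gaussian kernel (the function name, parameters, and test signal below are illustrative, not taken from the paper):

```python
import numpy as np

def mollify(u, delta=3, sigma=1.0):
    """Discrete mollification sketch: replace each sample of u by a
    Gaussian-weighted average over a stencil of half-width delta."""
    j = np.arange(-delta, delta + 1)
    w = np.exp(-(j / sigma) ** 2)
    w /= w.sum()  # normalize so the kernel has unit mass
    # reflect-pad the boundary so every point has a full stencil
    up = np.pad(u, delta, mode="reflect")
    return np.convolve(up, w, mode="valid")

# noisy samples of a smooth profile
x = np.linspace(0.0, 1.0, 101)
noisy = np.sin(np.pi * x) + 0.05 * np.random.default_rng(0).normal(size=x.size)
smoothed = mollify(noisy)
```

In backward heat conduction the mollified data, rather than the raw measurements, would then feed the space-marching scheme, which is what restores stability.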
Regularization Learning of Neural Networks for Generalization
In this paper, we propose a learning method of neural networks based on the regularization method and analyze its generalization capability. In learning from examples, training samples are independently drawn from some unknown probability distribution. The goal of learning is minimizing the expected risk for future test samples, which are also drawn from the same distribution. The problem can b...
Rodbar Dam Slope Stability Analysis Using Neural Networks
In this research, an artificial neural network is presented for predicting the safety factor and the critical safety factor of heterogeneous earth dams, taking into account the effect of earthquake inertia forces. The model inputs include the dam height and upstream slope angle, the earthquake coefficient, the water height, and the strength parameters of the core and shell; its output is the safety factor. The most important quantity sought in slope stability analysis is the safety factor. In this research ...
Regularization for Neural Networks
Research into regularization techniques is motivated by the tendency of neural networks to learn the specifics of the dataset they were trained on rather than general features that carry over to unseen data. This is known as overfitting. The goal of any supervised machine learning task is to approximate a function that maps inputs to outputs, given a dataset of examples and labels....
Learning Compact Neural Networks with Regularization
We study the impact of regularization for learning neural networks. Our goal is speeding up training, improving generalization performance, and training compact models that are cost efficient. Our results apply to weight-sharing (e.g. convolutional), sparsity (i.e. pruning), and low-rank constraints among others. We first introduce covering dimension of the constraint set and provide a Rademach...
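Among the constraints the abstract above lists is sparsity via pruning. A minimal sketch of unstructured magnitude pruning (zeroing the smallest-magnitude fraction of weights); the function name and threshold rule are illustrative assumptions, not the paper's method:

```python
import numpy as np

def magnitude_prune(w, sparsity):
    """Zero out the fraction `sparsity` of entries of w with the
    smallest absolute value (ties at the threshold are also pruned)."""
    k = int(sparsity * w.size)
    if k == 0:
        return w.copy()
    # k-th smallest magnitude serves as the pruning threshold
    thresh = np.partition(np.abs(w).ravel(), k - 1)[k - 1]
    mask = np.abs(w) > thresh
    return w * mask

rng = np.random.default_rng(1)
w = rng.normal(size=(8, 8))
pruned = magnitude_prune(w, 0.5)  # half the weights set to zero
```

Surviving weights are left unchanged; in practice pruning is typically interleaved with fine-tuning so the remaining weights can compensate.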
Journal
Journal title: TRANSACTIONS OF THE JAPAN SOCIETY OF MECHANICAL ENGINEERS Series C
Year: 1994
ISSN: 0387-5024,1884-8354
DOI: 10.1299/kikaic.60.4260